Orthogonal vs. uncorrelated least squares discriminant analysis for feature extraction

Authors

  • Feiping Nie
  • Shiming Xiang
  • Yun Liu
  • Chenping Hou
  • Changshui Zhang
Abstract

In this paper, a new discriminant analysis for feature extraction is derived from the perspective of least squares regression. To obtain strong discriminative power between classes, all the data points in each class are expected to be regressed to a single target vector, and the basic task is to find a transformation matrix such that the squared regression error is minimized. To this end, two least squares discriminant analysis methods are developed, one under the orthogonal constraint and one under the uncorrelated constraint. We show that orthogonal least squares discriminant analysis is an extension of null space linear discriminant analysis, and that uncorrelated least squares discriminant analysis is exactly equivalent to traditional linear discriminant analysis. Comparative experiments show that the orthogonal variant is preferable for real-world applications. © 2011 Elsevier B.V. All rights reserved.
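A hedged reconstruction of the formulation sketched in the abstract (the notation below is ours, not quoted from the paper): given n training samples x_i in R^d with class labels c(i) in {1, ..., K}, each class k is assigned a single target vector t_k in R^m, and the task is to find a transformation matrix W in R^{d x m} that minimizes the squared regression error

  \min_{W} \sum_{i=1}^{n} \left\| W^{\top} x_i - t_{c(i)} \right\|_2^2
  \quad \text{s.t.} \quad W^{\top} W = I \;\; \text{(orthogonal)} \quad \text{or} \quad W^{\top} S_t W = I \;\; \text{(uncorrelated)},

where S_t denotes the total scatter matrix. The Python sketch below illustrates only the unconstrained least-squares core of this regression view, using centered one-hot class codes as the per-class target vectors and a small ridge term for numerical stability; the function name ls_discriminant_directions is hypothetical, and the code is not the paper's constrained OLSDA/ULSDA solver.

    import numpy as np

    def ls_discriminant_directions(X, y, reg=1e-6):
        """Regress every sample onto its class's single target vector (a centered
        one-hot code) and return the least-squares transformation matrix W."""
        X = np.asarray(X, dtype=float)        # X has shape (n_samples, d)
        y = np.asarray(y)
        classes = np.unique(y)
        d = X.shape[1]
        T = (y[:, None] == classes[None, :]).astype(float)  # one-hot targets, shape (n, K)
        Xc = X - X.mean(axis=0)               # center the data
        Tc = T - T.mean(axis=0)               # center the targets
        # Closed-form ridge/least-squares solution: W = (Xc'Xc + reg*I)^{-1} Xc'Tc
        W = np.linalg.solve(Xc.T @ Xc + reg * np.eye(d), Xc.T @ Tc)
        return W  # columns span at most a (K-1)-dimensional discriminant subspace

    # Usage sketch: project (centered) data onto the learned directions.
    # Z = (X - X.mean(axis=0)) @ ls_discriminant_directions(X, y)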

Similar articles

Characterization of a Family of Algorithms for Generalized Discriminant Analysis on Undersampled Problems

A generalized discriminant analysis based on a new optimization criterion is presented. The criterion extends the optimization criteria of the classical Linear Discriminant Analysis (LDA) when the scatter matrices are singular. An efficient algorithm for the new optimization problem is presented. The solutions to the proposed criterion form a family of algorithms for generalized LDA, which can ...

Gabor-Based Kernel Partial-Least-Squares Discrimination Features for Face Recognition

The paper presents a novel method for the extraction of facial features based on the Gabor-wavelet representation of face images and the kernel partial-least-squares discrimination (KPLSD) algorithm. The proposed feature-extraction method, called the Gabor-based kernel partial-least-squares discrimination (GKPLSD), is performed in two consecutive steps. In the first step a set of forty Gabor wa...

From Eigenspots to Fisherspots - Latent Spaces in the Nonlinear Detection of Spot Patterns in a Highly Varying Background

We present a scheme for the development of a spot detection procedure which is based on the learning of latent linear features from a training data set. Adapting ideas from face recognition to this low-level feature extraction task, we suggest learning a collection of filters from representative data that span a subspace which allows for a reliable distinction of a spot vs. the heterogeneous ba...

Heteroscedastic linear feature extraction based on sufficiency conditions

Classification of high-dimensional data typically requires extraction of discriminant features. This paper proposes a linear feature extractor, called whitened linear sufficient statistic (WLSS), which is based on the sufficiency conditions for heteroscedastic Gaussian distributions. WLSS approximates, in the least squares sense, an operator providing a sufficient statistic. The proposed method...

Matrix Rank Reduction for

Numerical techniques for data analysis and feature extraction are discussed using the framework of matrix rank reduction. The singular value decomposition (SVD) and its properties are reviewed, and the relation to Latent Semantic Indexing (LSI) and Principal Component Analysis (PCA) is described. Methods that approximate the SVD are reviewed. A few basic methods for linear regression, in partic...


Journal:
  • Pattern Recognition Letters

Volume 33, Issue -

Pages -

Publication date: 2012